
    Computer-Assisted Liver Surgery: from preoperative 3D patient modelling to peroperative guidance

    Surgery offers the best survival rate in hepatic cancer. However, such interventions cannot be undertaken for all patients, as the eligibility rules for liver surgery lack accuracy and admit many exceptions. Medical image processing can lead to a major improvement in patient care by guiding the surgical gesture. We present here a new computer-assisted surgical procedure including preoperative 3D patient modelling, followed by virtual surgical planning and finalized by intraoperative computer guidance through the use of augmented reality (AR). First evaluations, including clinical applications, validate the expected benefit. The next step will consist in automating the intraoperative augmented reality system through the development of a hybrid operating room.
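    The intraoperative augmented-reality guidance described above hinges on registering the preoperative 3D patient model to the intraoperative scene. The abstract does not detail the authors' registration method; as an illustration only, a minimal sketch of rigid, landmark-based alignment (Kabsch/Procrustes), assuming paired preoperative and intraoperative landmark points and ignoring the deformable component that liver AR typically also requires, could look like this (all names are hypothetical):

        import numpy as np

        def rigid_register(model_pts, intraop_pts):
            """Kabsch/Procrustes: rigid transform aligning preoperative model
            landmarks to intraoperative landmarks (both Nx3, point-paired)."""
            cm, ci = model_pts.mean(0), intraop_pts.mean(0)
            H = (model_pts - cm).T @ (intraop_pts - ci)        # cross-covariance
            U, _, Vt = np.linalg.svd(H)
            d = np.sign(np.linalg.det(Vt.T @ U.T))             # avoid reflection
            R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T            # rotation
            t = ci - R @ cm                                    # translation
            return R, t

        # Overlay step (sketch): transform the model surface with
        # (R @ surface.T).T + t, then project it into the laparoscopic camera view.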

    Multiple Synchronous Squamous Cell Cancers of the Skin and Esophagus: Differential Management of Primary Versus Secondary Tumor

    Multiple primary tumors are uncommon in patients with squamous cell esophageal cancer, and conventional imaging methods have limitations in detecting them. Although 18F-fluorodeoxyglucose positron emission tomography (FDG-PET) increases the detection of multiple synchronous tumors in patients with other malignancies, its contribution in patients with squamous cell esophageal cancer has not been assessed, as it is not performed systematically. The detection of synchronous skin squamous cell tumors in patients with squamous cell esophageal cancer presents a challenge for diagnostic and therapeutic decision-making: a metastatic tumor leads to palliative management, whereas the diagnosis of a primary skin tumor requires curative treatment of both squamous cell tumors. Pathological evaluation therefore appears crucial in this decision.

    Proposal and multicentric validation of a laparoscopic Roux-en-Y gastric bypass surgery ontology.

    BACKGROUND: Phase and step annotation in surgical videos is a prerequisite for surgical scene understanding and for downstream tasks like intraoperative feedback or assistance. However, most ontologies are applied to small monocentric datasets and lack external validation. To overcome these limitations, an ontology for phases and steps of laparoscopic Roux-en-Y gastric bypass (LRYGB) is proposed and validated on a multicentric dataset in terms of inter- and intra-rater reliability (inter-/intra-RR).
    METHODS: The proposed LRYGB ontology consists of 12 phase and 46 step definitions that are hierarchically structured. Two board-certified surgeons (raters) with more than 10 years of clinical experience applied the proposed ontology to two datasets: (1) StraBypass40, 40 LRYGB videos from Nouvel Hôpital Civil, Strasbourg, France, and (2) BernBypass70, 70 LRYGB videos from Inselspital, Bern University Hospital, Bern, Switzerland. To assess inter-RR, the two raters' annotations of ten randomly chosen videos from each of StraBypass40 and BernBypass70 were compared. To assess intra-RR, ten randomly chosen videos were annotated twice by the same rater and the annotations were compared. Inter-RR was calculated using Cohen's kappa. Additionally, accuracy, precision, recall, F1-score, and application-dependent metrics were applied for inter- and intra-RR.
    RESULTS: The mean ± SD video duration was 108 ± 33 min in StraBypass40 and 75 ± 21 min in BernBypass70. The proposed ontology shows an inter-RR of 96.8 ± 2.7% for phases and 85.4 ± 6.0% for steps on StraBypass40, and 94.9 ± 5.8% for phases and 76.1 ± 13.9% for steps on BernBypass70. The overall Cohen's kappa of inter-RR was 95.9 ± 4.3% for phases and 80.8 ± 10.0% for steps. Intra-RR showed an accuracy of 98.4 ± 1.1% for phases and 88.1 ± 8.1% for steps.
    CONCLUSION: The proposed ontology shows excellent inter- and intra-RR and should therefore be implemented routinely in phase and step annotation of LRYGB.
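    The reliability figures above reduce to comparing two sets of frame-wise phase or step labels per video. A minimal sketch of how such agreement could be computed, assuming per-frame label sequences and scikit-learn's cohen_kappa_score (the function name, the per-video mean ± SD aggregation shown here, and the omission of the paper's application-dependent metrics are simplifying assumptions, not the authors' exact pipeline):

        import numpy as np
        from sklearn.metrics import cohen_kappa_score, accuracy_score

        def rater_agreement(videos_rater_a, videos_rater_b):
            """Each argument: list of per-video label sequences
            (one phase or step label per frame, same length per video)."""
            kappas, accs = [], []
            for a, b in zip(videos_rater_a, videos_rater_b):
                kappas.append(cohen_kappa_score(a, b))   # chance-corrected agreement
                accs.append(accuracy_score(a, b))        # raw frame-level agreement
            return (np.mean(kappas), np.std(kappas)), (np.mean(accs), np.std(accs))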

    Weakly Supervised Temporal Convolutional Networks for Fine-grained Surgical Activity Recognition

    Automatic recognition of fine-grained surgical activities, called steps, is a challenging but crucial task for intelligent intraoperative computer assistance. The development of current vision-based activity recognition methods relies heavily on a high volume of manually annotated data, which is difficult and time-consuming to generate and requires domain-specific knowledge. In this work, we propose to use coarser and easier-to-annotate activity labels, namely phases, as weak supervision to learn step recognition with fewer step-annotated videos. We introduce a step-phase dependency loss to exploit the weak supervision signal. We then employ a Single-Stage Temporal Convolutional Network (SS-TCN) with a ResNet-50 backbone, trained end-to-end from weakly annotated videos, for temporal activity segmentation and recognition. We extensively evaluate the proposed method and show its effectiveness on a large video dataset of 40 laparoscopic gastric bypass procedures and on the public CATARACTS benchmark of 50 cataract surgeries.
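    The two ingredients named in the abstract, a single-stage TCN over per-frame features and a step-phase dependency loss, can be sketched as follows. This is an illustrative PyTorch sketch only: the abstract does not give the exact loss formulation, so the step-to-phase aggregation below (summing step probabilities within their parent phase and supervising with the phase label) is one plausible reading, and all class and function names are hypothetical.

        import torch
        import torch.nn as nn
        import torch.nn.functional as F

        class SSTCN(nn.Module):
            """Single-stage TCN: stack of residual dilated temporal convolutions
            over precomputed per-frame features (e.g. from a ResNet-50 backbone)."""
            def __init__(self, in_dim, n_steps, n_layers=10, hidden=64):
                super().__init__()
                self.inp = nn.Conv1d(in_dim, hidden, 1)
                self.layers = nn.ModuleList(
                    nn.Conv1d(hidden, hidden, 3, padding=2 ** i, dilation=2 ** i)
                    for i in range(n_layers)
                )
                self.out = nn.Conv1d(hidden, n_steps, 1)

            def forward(self, feats):                 # feats: (B, in_dim, T)
                x = self.inp(feats)
                for conv in self.layers:
                    x = x + F.relu(conv(x))           # residual dilated block, length-preserving
                return self.out(x)                    # step logits: (B, n_steps, T)

        def step_phase_loss(step_logits, phase_labels, step_to_phase):
            """Weak supervision from phase labels: sum step probabilities within each
            parent phase and apply cross-entropy against the frame-wise phase label.
            step_to_phase: (n_steps, n_phases) 0/1 matrix, each step in exactly one phase
            (assumed known from the ontology)."""
            step_prob = step_logits.softmax(dim=1)                       # (B, n_steps, T)
            phase_prob = torch.einsum('bst,sp->bpt', step_prob, step_to_phase)
            return F.nll_loss(torch.log(phase_prob + 1e-8), phase_labels)  # phase_labels: (B, T)

    In this reading, videos with only phase annotations still provide gradient to the step classifier, because any mass the model places on steps outside the labelled phase is penalized; fully step-annotated videos would additionally use a standard per-frame cross-entropy on the step logits.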